
    What Makes a Computation Unconventional?

    A coherent mathematical overview of computation and its generalisations is presented. This conceptual framework is sufficient to comfortably host a wide range of contemporary thinking on embodied computation and its models. Comment: based on an invited lecture for the 'Symposium on Natural/Unconventional Computing and Its Philosophical Significance' at the AISB/IACAP World Congress 2012, University of Birmingham, July 2-6, 2012.

    The Computability-Theoretic Content of Emergence

    In dealing with emergent phenomena, a common task is to identify useful descriptions of them in terms of the underlying atomic processes, and to extract enough computational content from these descriptions to enable predictions to be made. Generally, the underlying atomic processes are quite well understood, and (with important exceptions) captured by mathematics from which it is relatively easy to extract algorithmic content. A widespread view is that the difficulty in describing transitions from algorithmic activity to the emergence associated with chaotic situations is simply a case of complexity outstripping computational resources and human ingenuity; or, on the other hand, that phenomena transcending the standard Turing model of computation, if they exist, must necessarily lie outside the domain of classical computability theory. In this article we suggest that much of the current confusion arises from conceptual gaps and the lack of a suitably fundamental model within which to situate emergence. We examine the potential for placing emergent relations in a familiar context based on Turing's 1939 model for interactive computation over structures described in terms of reals. The explanatory power of this model is explored, formalising informal descriptions in terms of mathematical definability and invariance, and relating a range of basic scientific puzzles to results and intractable problems in computability theory.

    Diffusion Controlled Reactions, Fluctuation Dominated Kinetics, and Living Cell Biochemistry

    In recent years a considerable portion of the computer science community has focused its attention on understanding living cell biochemistry, and efforts to understand such a complicated reaction environment have spread over a wide front, ranging from systems biology approaches, through network analysis (motif identification), to developing languages and simulators for low-level biochemical processes. Apart from simulation work, much of the effort is directed at using mean field equations (equivalent to the equations of classical chemical kinetics) to address various problems (stability, robustness, sensitivity analysis, etc.). Rarely is the use of mean field equations questioned. This review provides a brief overview of the situations when mean field equations fail and should not be used. These equations can be derived from the theory of diffusion controlled reactions, and emerge when the assumption of perfect mixing is made.
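
    As a point of reference for the mean field equations the review discusses, the sketch below integrates classical mass-action kinetics for a single bimolecular reaction A + B -> C under the perfect-mixing assumption. The rate constant, initial concentrations, and Euler step size are illustrative choices, not values taken from the review.

```python
# Minimal sketch: mean-field (classical chemical kinetics) equations for the
# bimolecular reaction A + B -> C, integrated with a simple Euler scheme.
# Rate constant, initial concentrations, and step size are illustrative
# assumptions only.

def mean_field_step(a, b, c, k, dt):
    """One Euler step of dA/dt = dB/dt = -k*A*B, dC/dt = +k*A*B."""
    rate = k * a * b                     # mass-action rate under perfect mixing
    return a - rate * dt, b - rate * dt, c + rate * dt

def simulate(a0=1.0, b0=0.5, k=2.0, dt=1e-3, steps=5000):
    """Integrate the mean-field equations for steps*dt time units."""
    a, b, c = a0, b0, 0.0
    for _ in range(steps):
        a, b, c = mean_field_step(a, b, c, k, dt)
    return a, b, c

if __name__ == "__main__":
    # Concentrations after 5 time units of well-mixed, deterministic kinetics.
    print(simulate())
```

    When mixing is imperfect and fluctuations dominate, deterministic equations of this kind are exactly what the review argues can fail.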

    Rule-based Modelling and Tunable Resolution

    We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.
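
    To make the idea of tunable resolution concrete, here is a toy Python sketch, not the authors' rule-based formalism, in which the same binding step is written first as a coarse rule that ignores internal state and then refined to require an active receptor. Agent names, states, and the dictionary encoding are invented for this illustration.

```python
# Toy illustration of tunable resolution in rule-based modelling.
# The rule and agent names below are invented; they only mimic the idea of
# refining a coarse rule by increasing agent and rule resolution.

def coarse_bind(receptor, ligand):
    """Low-resolution rule: R + L -> R.L, regardless of internal state."""
    return {"complex": (receptor["name"], ligand["name"])}

def refined_bind(receptor, ligand):
    """Higher-resolution variant: binding fires only if R is phosphorylated."""
    if receptor.get("state") == "phosphorylated":
        return {"complex": (receptor["name"], ligand["name"])}
    return None  # the refined rule does not apply to an inactive receptor

if __name__ == "__main__":
    r_inactive = {"name": "R", "state": "unphosphorylated"}
    r_active = {"name": "R", "state": "phosphorylated"}
    ligand = {"name": "L"}
    print(coarse_bind(r_inactive, ligand))   # fires in the coarse model
    print(refined_bind(r_inactive, ligand))  # None: refined rule is more selective
    print(refined_bind(r_active, ligand))    # fires only for the active receptor
```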

    Two-Domain DNA Strand Displacement

    We investigate the computing power of a restricted class of DNA strand displacement structures: those that are made of double strands with nicks (interruptions) in the top strand. To preserve this structural invariant, we impose restrictions on the single strands they interact with: we consider only two-domain single strands consisting of one toehold domain and one recognition domain. We study fork and join signal-processing gates based on these structures, and we show that these systems are amenable to formalization and to mechanical verification.
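
    As an informal illustration of the restricted structures described above, the sketch below models a nicked double strand and a single toehold-mediated displacement step driven by a two-domain signal. The data structures, domain names, and simplified mechanics are assumptions made for this sketch, not the paper's formal calculus.

```python
# Schematic sketch of one displacement step on a nicked double strand.
# A two-domain signal (toehold, recognition) binds an exposed toehold on the
# bottom strand and displaces the first top-strand segment, exposing the next
# toehold. All names and the encoding are invented for illustration.

def displace(gate, signal):
    """gate  : {'exposed': toehold currently uncovered on the bottom strand,
                'segments': list of top-strand segments between nicks, each a
                            (recognition, next_toehold) pair}
       signal: (toehold, recognition) two-domain single strand.
       Returns (new_gate, released_segment) if the step fires, else None."""
    toehold, recog = signal
    if not gate["segments"]:
        return None
    first = gate["segments"][0]
    # The signal must match both the exposed toehold and the recognition domain
    # of the first covered segment in order to bind and branch-migrate.
    if gate["exposed"] != toehold or first[0] != recog:
        return None
    # The first top segment is released; the toehold it covered is now exposed.
    new_gate = {"exposed": first[1], "segments": gate["segments"][1:]}
    return new_gate, first

if __name__ == "__main__":
    # A gate with exposed toehold 't' and two covered segments.
    gate = {"exposed": "t", "segments": [("x", "u"), ("y", "v")]}
    step1 = displace(gate, ("t", "x"))       # fires: releases ('x', 'u'), exposes 'u'
    print(step1)
    step2 = displace(step1[0], ("u", "y"))   # second input at the newly exposed toehold
    print(step2)
    print(displace(gate, ("t", "y")))        # None: recognition domain mismatch
```

    A join gate in this style would consume two such inputs in sequence before releasing its outputs; the two-step run above mimics that input-consuming phase only.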

    Turing Automata and Graph Machines

    Indexed monoidal algebras are introduced as an equivalent structure for self-dual compact closed categories, and a coherence theorem is proved for the category of such algebras. Turing automata and Turing graph machines are defined by generalizing the classical Turing machine concept, so that the collection of such machines becomes an indexed monoidal algebra. On the analogy of the von Neumann data-flow computer architecture, Turing graph machines are proposed as potentially reversible low-level universal computational devices, and a truly reversible molecular size hardware model is presented as an example.